# Compute-Optimal Training

Cerebras-GPT 590M
Apache-2.0
Cerebras-GPT 590M is a Transformer-based language model in the Cerebras-GPT family. The family was created to study the scaling laws of large language models and to demonstrate the simplicity and scalability of training them on the Cerebras software and hardware stack. A short usage sketch follows the model card below.
Tags: Large Language Model · Transformers · English
cerebras · 2,430 downloads · 21 likes
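
As a minimal sketch of using the model: assuming it is published on the Hugging Face Hub under the repo id `cerebras/Cerebras-GPT-590M` (the family's usual naming convention, not stated on this page), it can be loaded with the `transformers` library named in the tags above:

```python
# Minimal sketch: load Cerebras-GPT 590M with Hugging Face transformers.
# The repo id "cerebras/Cerebras-GPT-590M" is an assumption based on the
# family's published naming convention, not confirmed by this page.
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "cerebras/Cerebras-GPT-590M"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Greedy decoding of a short continuation as a smoke test.
prompt = "Compute-optimal training means"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

On the "compute-optimal" tag itself: the Cerebras-GPT family reportedly follows the Chinchilla compute-optimal recipe of roughly 20 training tokens per parameter, which for a 590M-parameter model works out to a training budget of about 11.8B tokens.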